A note on extension of sliced average variance estimation to multivariate regression
Authors
Abstract
Rand Corporation, Pittsburgh, PA 15213; e-mail: [email protected]

Many sufficient dimension reduction methodologies for univariate regression have been extended to multivariate regression. Sliced average variance estimation (SAVE) has the potential to recover more reductive information, and recent developments enable testing the dimension and predictor effects with distributions commonly used in the literature. The main purpose of this paper is to carry the functionality of SAVE over to multivariate regression; to this end, three methods are proposed. The asymptotic behaviors ...
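For readers unfamiliar with SAVE in the univariate-response setting the abstract builds on, here is a minimal sketch of the standard algorithm: standardize the predictors, slice on the response, average the squared deviations of the within-slice covariances from the identity, and take leading eigenvectors. The function name, slicing scheme, and simulated example are illustrative choices, not taken from the paper.

```python
import numpy as np

def save_directions(X, y, n_slices=5, n_dirs=1):
    """Sketch of sliced average variance estimation (SAVE) for univariate y.

    X: (n, p) predictor matrix; y: (n,) response.
    Returns a (p, n_dirs) basis estimate for the central subspace.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_half = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_half

    # Slice the response at its quantiles
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    M = np.zeros((p, p))
    for h in range(n_slices):
        if h == n_slices - 1:
            mask = (y >= edges[h]) & (y <= edges[h + 1])
        else:
            mask = (y >= edges[h]) & (y < edges[h + 1])
        Zh = Z[mask]
        if len(Zh) < 2:
            continue
        # SAVE kernel: weighted average of (I - V_h)^2 over slices
        Vh = np.cov(Zh, rowvar=False)
        D = np.eye(p) - Vh
        M += (len(Zh) / n) * (D @ D)

    # Leading eigenvectors of M, mapped back to the original X scale
    w, V = np.linalg.eigh(M)
    B = Sigma_inv_half @ V[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)
```

On the classic example where SIR fails but SAVE succeeds, a symmetric response such as y = x1^2 + noise, the estimated direction aligns with the first coordinate axis.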
Similar resources
On model-free conditional coordinate tests for regressions
Existing model-free tests of the conditional coordinate hypothesis in sufficient dimension reduction (Cook (1998) [3]) have focused mainly on first-order estimation methods such as sliced inverse regression estimation (Li (1991) [14]). Such testing procedures, based on quadratic inference functions, are difficult to extend to second-order sufficient dimension reduction methods such as the...
Full text

Asymptotics for sliced average variance estimation
In this paper, we systematically study the consistency of sliced average variance estimation (SAVE). The findings reveal that when the response is continuous, the asymptotic behavior of SAVE is rather different from that of sliced inverse regression (SIR). SIR can achieve √n-consistency even when each slice contains only two data points. However, SAVE cannot be √n-consistent and it even turns...
Full text

Likelihood-based Sufficient Dimension Reduction
We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation, and directional regression, and that it seems quite robust to deviations from normality.
Full text

Sufficient Dimension Reduction With Missing Predictors
In high-dimensional data analysis, sufficient dimension reduction (SDR) methods are effective in reducing the predictor dimension, while retaining full regression information and imposing no parametric models. However, it is common in high-dimensional data that a subset of predictors may have missing observations. Existing SDR methods resort to the complete-case analysis by removing all the sub...
Full text

On the distribution of the left singular vectors of a random matrix and its applications
In several dimension reduction techniques, the original variables are replaced by a smaller number of linear combinations. The coefficients of these linear combinations are typically the elements of the left singular vectors of a random matrix. We derive the asymptotic distribution of the left singular vectors of a random matrix that has a normal limit distribution. This result is then used to ...
Full text
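The last abstract above concerns the left singular vectors whose elements serve as coefficients of the reduced linear combinations. As a concrete reminder of the objects involved, a short NumPy sketch (the matrix here is a random stand-in, not data from any of the papers):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))

# Thin SVD: A = U @ np.diag(s) @ Vt.
# The columns of U are the left singular vectors of A; in dimension
# reduction they supply the coefficients of the linear combinations
# that replace the original variables.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The left singular vectors are orthonormal, and the top singular
# pair gives the best rank-1 approximation of A in Frobenius norm.
orthonormal = np.allclose(U.T @ U, np.eye(3))
A1 = s[0] * np.outer(U[:, 0], Vt[0])
```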